# Dynamic whole word masking
HerBERT is a Polish pre-trained language model based on the BERT architecture, trained using dynamic whole word masking and sentence structure objectives. Two checkpoints from the allegro organization are listed below.

| Model | Organization | Tags | Downloads | Likes |
|---|---|---|---|---|
| Herbert Base Cased | allegro | Large Language Model, Other | 84.18k | 17 |
| Herbert Large Cased | allegro | Large Language Model, Other | 1,272 | 6 |
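A minimal sketch of loading one of these checkpoints with the Hugging Face `transformers` library and extracting contextual embeddings for a Polish sentence. The model identifier `allegro/herbert-base-cased` is assumed to correspond to the Base Cased entry above (swap in `allegro/herbert-large-cased` for the Large model); the example sentence is illustrative only.

```python
# Sketch: load HerBERT and run a forward pass to get token embeddings.
# Assumes the checkpoint is published on the Hub as "allegro/herbert-base-cased".
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("allegro/herbert-base-cased")
model = AutoModel.from_pretrained("allegro/herbert-base-cased")

# Encode an example Polish sentence and obtain contextual representations.
inputs = tokenizer("Kodeks pracy reguluje prawa i obowiązki pracowników.", return_tensors="pt")
outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```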